Consistency Of Restricted Maximum Likelihood Estimators Of Principal Components
Authors
Abstract
In this paper we consider two closely related problems: estimation of the eigenvalues and eigenfunctions of the covariance kernel of functional data based on (possibly) irregular measurements, and estimation of the eigenvalues and eigenvectors of the covariance matrix of high-dimensional Gaussian vectors. In [23], a restricted maximum likelihood (REML) approach was developed to deal with the first problem. In this paper, we establish consistency and derive the rate of convergence of the REML estimator in the functional data case, under appropriate smoothness conditions. Moreover, we prove that when the number of measurements per sample curve is bounded, the rate of convergence of the REML estimators of the eigenfunctions under squared-error loss is near-optimal. In the case of Gaussian vectors, asymptotic consistency and an efficient score representation of the estimators are obtained under the assumption that the effective dimension grows at a rate slower than the sample size. These results are derived through explicit use of the intrinsic geometry of the parameter space, which is non-Euclidean. Moreover, the results derived in this paper suggest an asymptotic equivalence between inference for functional data with dense measurements and inference for high-dimensional Gaussian vectors.
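To make the Gaussian-vector side of the problem concrete, the following is a minimal sketch (not the paper's REML method, just the naive sample-covariance PCA estimator it is compared against): simulate Gaussian vectors with a known low-rank-plus-noise covariance, then estimate the leading eigenvalue and eigenvector by eigendecomposition of the sample covariance. All dimensions and the covariance model here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: n Gaussian vectors in dimension p whose covariance is
# "r spiked eigenvalues plus identity noise" (a common model in this literature).
n, p, r = 500, 20, 3
true_vecs, _ = np.linalg.qr(rng.standard_normal((p, r)))  # orthonormal leading eigenvectors
true_vals = np.array([10.0, 5.0, 2.0])                    # spiked eigenvalues
cov = true_vecs @ np.diag(true_vals) @ true_vecs.T + np.eye(p)
X = rng.multivariate_normal(np.zeros(p), cov, size=n)

# Sample covariance and its eigendecomposition (the naive PCA estimator).
Xc = X - X.mean(axis=0)
S = Xc.T @ Xc / (n - 1)
eigvals, eigvecs = np.linalg.eigh(S)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]        # sort descending

# Squared-error loss for the leading eigenvector, taken up to sign,
# since eigenvectors are only identified up to a sign flip.
v_hat, v_true = eigvecs[:, 0], true_vecs[:, 0]
loss = min(np.sum((v_hat - v_true) ** 2), np.sum((v_hat + v_true) ** 2))
print(eigvals[0], loss)
```

With n well above p, as here, the leading sample eigenvalue lands near the true value 11.0 and the eigenvector loss is small; the regime the abstract addresses is the harder one where the (effective) dimension grows with the sample size and such naive estimates degrade.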
Similar Articles
Consistency of restricted maximum likelihood estimators of principal components Running Title: Consistency of REML estimators
In this paper we consider two closely related problems: estimation of eigenvalues and eigenfunctions of the covariance kernel of functional data based on (possibly) irregular measurements, and the problem of estimating the eigenvalues and eigenvectors of the covariance matrix for high-dimensional Gaussian vectors. In Peng and Paul (2007), a restricted maximum likelihood (REML) approach has bee...
Consistency of Restricted Maximum Likelihood Estimators of Principal Components by Debashis Paul and
In this paper we consider two closely related problems: estimation of eigenvalues and eigenfunctions of the covariance kernel of functional data based on (possibly) irregular measurements, and the problem of estimating the eigenvalues and eigenvectors of the covariance matrix for high-dimensional Gaussian vectors. In [A geometric approach to maximum likelihood estimation of covariance kernel fro...
Consistency of Semiparametric Maximum Likelihood Estimators for Two-Phase Sampling
Semiparametric maximum likelihood estimators have recently been proposed for a class of two-phase, outcome-dependent sampling models. All of them were "restricted" maximum likelihood estimators, in the sense that the maximization is carried out only over distributions concentrated on the observed values of the covariate vectors. In this paper, the authors give conditions for consistency of thes...
Consistency of Semiparametric Maximum Likelihood Estimators for Two-Phase, Outcome-Dependent Sampling
Semiparametric maximum likelihood estimators have recently been proposed for a class of two-phase, outcome-dependent sampling models; e.g. Breslow and Holubkov (1997), Scott and Wild (1998), and Lawless, Wild, and Kalbfleisch (1999). The estimators studied by these authors are predicated on the estimates of the underlying covariate distribution being concentrated on the observed covariate values...
Statistical Analysis of Factor Models of High Dimension
This paper considers the maximum likelihood estimation of factor models of high dimension, where the number of variables (N) is comparable with or even greater than the number of observations (T). An inferential theory is developed. We establish not only consistency but also the rate of convergence and the limiting distributions. Five different sets of identification conditions are considered...